Comments on a Monte Carlo approach to the analysis of functional neuroimaging data.
Abstract
Functional neuroimaging is probably always going to be methodologically pluralistic. There are several reasons for this. For example, brain functions or processes can be characterized at different levels and scales, and it may be that there is no fundamental processing level; rather, different phenomena may be optimally described at different scales or levels. Methods developed and validated at specific spatiotemporal scales and for certain parameter ranges (e.g., degrees of freedom, amount of filtering) may not be applicable at other spatiotemporal scales or parameter ranges. Furthermore, the rapid development of fMRI and MEG/EEG illustrates the need for descriptive, exploratory, and inferential methods. Descriptive and exploratory methods are useful in characterizing the nature of the signal present in the data, while inferential methods are used to test hypotheses and to determine confidence intervals.

Basically, there are three inferential approaches to the analysis of functional imaging data: theoretical parametric approaches (e.g., Friston et al., 1995; Worsley et al., 1992, 1996), nonparametric approaches (e.g., Holmes et al., 1996), and Monte Carlo or simulation approaches (e.g., Forman et al., 1995; Poline and Mazoyer, 1993). These approaches differ in the assumptions made about the properties of the data and in the approximations used in their statistical analyses. What matters is not the number of assumptions or the character of the approximations made, but how well these assumptions and approximations are met by empirical data, and how robust the method is when the assumptions are not fully met. This emphasizes the importance of empirical validation and explicit characterization of the inherent limitations of a given method.

Progress in, and the credibility of, a scientific field depend critically on the long-term consistency and convergence of empirical results. Discussion and critical evaluation of the methods used in a given field are of vital importance in this process. An example of such a critical evaluation and discussion is summarized below.

Recently, functional neuroimaging studies have been published in Nature (Geyer et al., 1996), Science (Kinomura et al., 1996), and the Proceedings of the National Academy of Sciences of the USA (Roland et al., 1998) using a cluster analysis method described by Roland et al. (1993). This method has been criticized by Frackowiak et al. (1996) and subsequently defended by Roland and Gulyas (1996). In this issue of NeuroImage, Roland and colleagues (Ledberg et al., 1998) return to some of the issues previously raised. Ledberg et al. (1998) describe a revised version of the Roland et al. (1993) method, acknowledging the critique of Frackowiak et al. (1996). This illustrates the importance of proper empirical validation of any proposed method before it is accepted and applied to experimental data. The constructive result of this critical evaluation is a significant improvement of the method (Ledberg et al., 1998).

The reason for the closer examination of the Roland et al. (1993) method, and for the consequent discussion in the European Journal of Neuroscience, was diverging results and interpretations of data relating to the functional neuroanatomy of vision, in particular color perception (Frackowiak et al., 1996). Two general topics are at issue. The first relates to the functional anatomy of vision, which is not discussed in Ledberg et al. (1998) and will not be discussed here.
The second issue relates to methodology and is independent of the first, although conclusions about the functional anatomy of vision most certainly depend on the method used. A Monte Carlo approach to the analysis of PET data using cluster size as the test statistic was proposed by Poline and Mazoyer (1993), and a similar approach has been applied to the analysis of fMRI data (Forman et al., 1995). In general, Monte Carlo approaches are critically dependent on adequately characterizing the image noise and on using sufficient numbers of simulated realizations, since the tails of the observed probability distribution must be estimated accurately.
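To make these two dependencies concrete, the following is a minimal sketch in Python of a cluster-size Monte Carlo test. It is not the procedure of Poline and Mazoyer (1993), Forman et al. (1995), or Ledberg et al. (1998); it assumes stationary Gaussian noise whose smoothness is summarized by a single, hypothetical filter-width parameter (`fwhm_vox`), which in practice would have to be estimated from the residuals of the real data.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, label


def max_cluster_size(image, threshold):
    """Return the size in voxels of the largest suprathreshold cluster."""
    labeled, n_clusters = label(image > threshold)
    if n_clusters == 0:
        return 0
    # bincount gives voxel counts per label; index 0 is the background
    return int(np.bincount(labeled.ravel())[1:].max())


def simulate_null_cluster_sizes(shape, fwhm_vox, threshold,
                                n_sim=10_000, seed=0):
    """Monte Carlo null distribution of the maximum cluster size under
    smoothed, stationary Gaussian noise (a strong assumption)."""
    rng = np.random.default_rng(seed)
    # convert full width at half maximum (in voxels) to a Gaussian sigma
    sigma = fwhm_vox / (2.0 * np.sqrt(2.0 * np.log(2.0)))
    sizes = np.empty(n_sim, dtype=int)
    for i in range(n_sim):
        noise = gaussian_filter(rng.standard_normal(shape), sigma)
        noise /= noise.std()  # restore unit variance after smoothing
        sizes[i] = max_cluster_size(noise, threshold)
    return sizes


# Usage: p-value for a hypothetical observed cluster of k voxels
# at the same threshold.
# null = simulate_null_cluster_sizes((64, 64, 32), fwhm_vox=3.0, threshold=2.5)
# k = 120
# p = (1 + np.sum(null >= k)) / (1 + len(null))
```

Because the p-value for a large observed cluster lies in the extreme tail of the simulated distribution, the number of realizations bounds the smallest p-value that can be estimated reliably (roughly 1/n_sim); and a mischaracterized noise smoothness shifts the entire null distribution, biasing every inference drawn from it.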
متن کاملذخیره در منابع من
با ذخیره ی این منبع در منابع من، دسترسی به آن را برای استفاده های بعدی آسان تر کنید
برای دانلود متن کامل این مقاله و بیش از 32 میلیون مقاله دیگر ابتدا ثبت نام کنید
ثبت ناماگر عضو سایت هستید لطفا وارد حساب کاربری خود شوید
Journal: NeuroImage, Volume 8, Issue 2, pages 108–112, 1998.